Drift function


Discovered Policy Optimisation

Neural Information Processing Systems

Most of these advancements came through the continual development of new algorithms, which were designed using a combination of mathematical derivations, intuitions, and experimentation. Such an approach of creating algorithms manually is limited by human understanding and ingenuity.


Plug-In Classification of Drift Functions in Diffusion Processes Using Neural Networks

Yuzhen Zhao, Jiarong Fan, Yating Liu

arXiv.org Machine Learning

We study a supervised multiclass classification problem for diffusion processes, where each class is characterized by a distinct drift function and trajectories are observed at discrete times. Extending the one-dimensional multiclass framework of Denis et al. (2024) to multidimensional diffusions, we propose a neural network-based plug-in classifier that estimates the drift functions for each class from independent sample paths and assigns labels based on a Bayes-type decision rule. Under standard regularity assumptions, we establish convergence rates for the excess misclassification risk, explicitly capturing the effects of drift estimation error and time discretization. Numerical experiments demonstrate that the proposed method achieves faster convergence and improved classification performance compared to Denis et al. (2024) in the one-dimensional setting, remains effective in higher dimensions when the underlying drift functions admit a compositional structure, and consistently outperforms direct neural network classifiers trained end-to-end on trajectories without exploiting the diffusion model structure.
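The Bayes-type decision rule at the heart of this classifier can be illustrated in a few lines: given candidate drift functions b_k (hand-specified below as stand-ins for the paper's neural-network estimates), a discretely observed path is scored with the discretised Girsanov log-likelihood and assigned to the highest-scoring class. A minimal 1-D numpy sketch; the two drifts, step size, and path counts are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical classes of 1-D diffusions dX = b_k(X) dt + dW; in the
# paper the b_k would be neural-network estimates, here they are fixed.
drifts = [lambda x: -x, lambda x: -4.0 * x]

def simulate(b, n_steps=200, dt=0.01, x0=1.0):
    """Euler-Maruyama path of dX = b(X) dt + dW."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        x[i + 1] = x[i] + b(x[i]) * dt + np.sqrt(dt) * rng.standard_normal()
    return x

def classify(path, dt=0.01):
    """Bayes-type rule: pick the class maximising the discretised
    Girsanov log-likelihood sum_i b_k(X_i) dX_i - (dt/2) sum_i b_k(X_i)^2."""
    dx, xs = np.diff(path), path[:-1]
    scores = [np.sum(b(xs) * dx) - 0.5 * dt * np.sum(b(xs) ** 2) for b in drifts]
    return int(np.argmax(scores))

# Paths simulated from class k should mostly be assigned label k.
acc = np.mean([classify(simulate(drifts[k])) == k
               for k in (0, 1) for _ in range(50)])
print(acc)
```

The sharper the separation between the drift functions (here, mean-reversion rates 1 vs. 4), the larger the expected log-likelihood gap and the higher the accuracy.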


Approximate Gaussian process inference for the drift function in stochastic differential equations

Andreas Ruttor, Philipp Batz, Manfred Opper

Neural Information Processing Systems

We introduce a nonparametric approach for estimating drift functions in systems of stochastic differential equations from sparse observations of the state vector. Using a Gaussian process prior over the drift as a function of the state vector, we develop an approximate EM algorithm to deal with the unobserved, latent dynamics between observations. The posterior over states is approximated by a piecewise linearized process of the Ornstein-Uhlenbeck type and the MAP estimation of the drift is facilitated by a sparse Gaussian process regression.
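In the densely observed special case, the approach reduces to Gaussian process regression of the Euler increments on the states, which a short numpy sketch can show; the EM treatment of latent dynamics between sparse observations is the paper's actual contribution and is omitted here. The kernel, inducing-point count, and all constants below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a densely observed 1-D SDE dX = b(X) dt + sigma dW with b(x) = -2x.
dt, sigma, n = 0.02, 0.5, 2000
x = np.empty(n + 1)
x[0] = 0.0
for i in range(n):
    x[i + 1] = x[i] - 2.0 * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# The Euler increments y_i = (X_{i+1} - X_i)/dt are noisy drift evaluations
# with observation variance sigma^2/dt, so drift inference becomes GP regression.
X, y = x[:-1], np.diff(x) / dt

def rbf(a, b, ell=1.0):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# Sparse-GP flavour: condition on m inducing points rather than all n states
# (a subset-of-regressors approximation).
Z = np.linspace(x.min(), x.max(), 20)
noise = sigma ** 2 / dt
Kzz = rbf(Z, Z) + 1e-8 * np.eye(len(Z))
Kzx = rbf(Z, X)
w = np.linalg.solve(Kzz + Kzx @ Kzx.T / noise, Kzx @ y / noise)

def drift_hat(q):
    return rbf(np.atleast_1d(q), Z) @ w

grid = np.linspace(-0.5, 0.5, 7)
err = np.max(np.abs(drift_hat(grid) - (-2.0 * grid)))
print(err)
```

The 20 inducing points reduce the linear solve from n x n to m x m, which is the computational point of the sparse approximation.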


Drift Estimation for Diffusion Processes Using Neural Networks Based on Discretely Observed Independent Paths

Yuzhen Zhao, Yating Liu, Marc Hoffmann

arXiv.org Machine Learning

This paper addresses the nonparametric estimation of the drift function over a compact domain for a time-homogeneous diffusion process, based on high-frequency discrete observations from $N$ independent trajectories. We propose a neural network-based estimator and derive a non-asymptotic convergence rate, decomposed into a training error, an approximation error, and a diffusion-related term scaling as ${\log N}/{N}$. For compositional drift functions, we establish an explicit rate. In the numerical experiments, we consider a drift function whose local oscillations are generated by a double-layer compositional structure, and show that the empirical convergence rate becomes independent of the input dimension $d$. Compared to the $B$-spline method, the neural network estimator achieves better convergence rates and more effectively captures local features, particularly in higher-dimensional settings.
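The estimator's core regression step, fitting a network to the pooled increments $(X_{t_{i+1}} - X_{t_i})/\Delta$ from the $N$ paths (noisy evaluations of the drift), can be sketched with a hand-rolled one-hidden-layer network. The toy Ornstein-Uhlenbeck drift, architecture, and learning rate below are assumptions; none of the paper's architecture choices are reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Pool increments from N independent Ornstein-Uhlenbeck paths dX = -X dt + 0.3 dW.
dt, sigma, n_paths, n_steps = 0.05, 0.3, 200, 50
xs, ys = [], []
for _ in range(n_paths):
    x = rng.normal()
    for _ in range(n_steps):
        x_next = x - x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        xs.append(x)
        ys.append((x_next - x) / dt)   # noisy evaluation of the drift at x
        x = x_next
X = np.array(xs)[:, None]
y = np.array(ys)[:, None]

# One-hidden-layer network fitted by full-batch gradient descent on the
# least-squares objective that defines the estimator.
h, lr = 16, 0.02
W1 = rng.normal(scale=0.5, size=(1, h)); b1 = np.zeros(h)
W2 = rng.normal(scale=0.5, size=(h, 1)); b2 = np.zeros(1)
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)
    g = 2.0 * (H @ W2 + b2 - y) / len(X)   # d(MSE)/d(prediction)
    gH = (g @ W2.T) * (1.0 - H ** 2)       # back through tanh
    W2 -= lr * (H.T @ g);  b2 -= lr * g.sum(0)
    W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(0)

def drift_nn(q):
    """Evaluate the fitted drift estimate at query points q."""
    return (np.tanh(np.atleast_2d(q).T @ W1 + b1) @ W2 + b2).ravel()

grid = np.linspace(-0.8, 0.8, 5)
err = np.max(np.abs(drift_nn(grid) - (-grid)))
print(err)
```

The increments have variance $\sigma^2/\Delta$ per observation, so the fit relies on averaging over many pooled samples rather than on any single accurate drift evaluation.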




Nonparametric learning of stochastic differential equations from sparse and noisy data

Arnab Ganguly, Riten Mitra, Jinpu Zhou

arXiv.org Machine Learning

The paper proposes a systematic framework for building data-driven stochastic differential equation (SDE) models from sparse, noisy observations. Unlike traditional parametric approaches, which assume a known functional form for the drift, our goal here is to learn the entire drift function directly from data without strong structural assumptions, making it especially relevant in scientific disciplines where system dynamics are partially understood or highly complex. We cast the estimation problem as minimization of the penalized negative log-likelihood functional over a reproducing kernel Hilbert space (RKHS). In the sparse observation regime, the presence of unobserved trajectory segments makes the SDE likelihood intractable. To address this, we develop an Expectation-Maximization (EM) algorithm that employs a novel Sequential Monte Carlo (SMC) method to approximate the filtering distribution and generate Monte Carlo estimates of the E-step objective. The M-step then reduces to a penalized empirical risk minimization problem in the RKHS, whose minimizer is given by a finite linear combination of kernel functions via a generalized representer theorem. To control model complexity across EM iterations, we also develop a hybrid Bayesian variant of the algorithm that uses shrinkage priors to identify significant coefficients in the kernel expansion. We establish important theoretical convergence results for both the exact and approximate EM sequences. The resulting EM-SMC-RKHS procedure enables accurate estimation of the drift function of stochastic dynamical systems in low-data regimes and is broadly applicable across domains requiring continuous-time modeling under observational constraints. We demonstrate the effectiveness of our method through a series of numerical experiments.
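The M-step described above has a closed form worth seeing concretely: by the representer theorem, the penalised empirical risk minimiser over the RKHS is a finite kernel expansion whose coefficients come from a single linear solve. The sketch below uses a fully observed path, so the increments play the role the SMC-based E-step would otherwise supply; the kernel, toy drift, and ridge level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Fully observed stand-in: simulate dX = b(X) dt + sigma dW and regress the
# increments (noisy drift evaluations) on the states. In the paper, the
# SMC-based E-step supplies these quantities when the path is only
# partially observed.
dt, sigma, n = 0.02, 0.4, 1500
b_true = lambda z: -z ** 3
x = np.empty(n + 1); x[0] = 0.0
for i in range(n):
    x[i + 1] = x[i] + b_true(x[i]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
X, y = x[:-1], np.diff(x) / dt

def K(a, b, ell=0.5):
    """RBF kernel of the (assumed) RKHS."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# Representer theorem: the penalised minimiser is sum_i alpha_i k(., X_i),
# with alpha from one linear solve; the ridge level is matched to the
# increment noise variance sigma^2/dt.
alpha = np.linalg.solve(K(X, X) + (sigma ** 2 / dt) * np.eye(n), y)

def drift_hat(q):
    return K(np.atleast_1d(q), X) @ alpha

grid = np.linspace(-0.7, 0.7, 7)
err = np.max(np.abs(drift_hat(grid) - b_true(grid)))
print(err)
```

The paper's hybrid Bayesian variant would additionally shrink most entries of alpha toward zero to control the size of the kernel expansion across EM iterations.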


Empirical risk minimization algorithm for multiclass classification of S.D.E. paths

Christophe Denis, Eddy Ella Mintsa

arXiv.org Machine Learning

Functional data analysis (Ramsay & Silverman, 2005) is an active field of research, as technological progress has facilitated the large-scale collection of such data in a wide range of fields, including biology (Crow, 2017), physics (Romanczuk et al., 2012), and finance (Lamberton & Lapeyre, 2011). In particular, the classification of functional data is at the core of recent research efforts (Ismail Fawaz et al., 2019; Xiao et al., 2022). A specific case arises when the data are assumed to be generated by diffusion processes. Statistical methods for stochastic differential equations have attracted significant attention in the literature.


Learning Stochastic Dynamical Systems with Structured Noise

Ziheng Guo, James Greene, Ming Zhong

arXiv.org Machine Learning

Stochastic differential equations (SDEs) are a ubiquitous modeling framework that finds applications in physics, biology, engineering, social science, and finance. Due to the availability of large-scale data sets, there is growing interest in learning mechanistic models from observations with stochastic noise. In this work, we present a nonparametric framework to learn both the drift and diffusion terms in systems of SDEs where the stochastic noise is singular. Specifically, inspired by second-order equations from classical physics, we consider systems which possess structured noise, i.e. noise with a singular covariance matrix. We provide an algorithm for constructing estimators given trajectory data and demonstrate the effectiveness of our methods via a number of examples from physics and biology. As the developed framework is most naturally applicable to systems possessing a high degree of dimensionality reduction (i.e. symmetry), we also apply it to the high dimensional Cucker-Smale flocking model studied in collective dynamics and show that it is able to accurately infer the low dimensional interaction kernel from particle data.
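The "structured noise" setting can be made concrete with the smallest second-order example: a damped harmonic oscillator in which Brownian forcing enters only the velocity equation, so the joint noise covariance diag(0, sigma^2) is singular. The system, coefficients, and plain least-squares estimator below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(4)

# Second-order system with structured noise: position q is noiseless and
# only the velocity p is driven by Brownian motion, so the joint noise
# covariance diag(0, sigma^2) is singular.
#   dq = p dt
#   dp = (-q - 0.5 p) dt + sigma dW
dt, sigma, n = 0.01, 0.3, 5000
q = np.empty(n + 1); p = np.empty(n + 1)
q[0], p[0] = 1.0, 0.0
for i in range(n):
    q[i + 1] = q[i] + p[i] * dt
    p[i + 1] = p[i] + (-q[i] - 0.5 * p[i]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Estimate the velocity-equation drift by least squares of dp/dt on (q, p);
# the singular structure means only this row carries noise and needs to be
# regressed -- the position equation is deterministic given p.
A = np.column_stack([q[:-1], p[:-1]])
coef, *_ = np.linalg.lstsq(A, np.diff(p) / dt, rcond=None)
print(coef)   # should be close to (-1, -0.5)
```

Exploiting the singular covariance this way, rather than fitting a full 2-D diffusion matrix, is the dimensionality reduction the abstract refers to.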